A Pipelined Pre-training Algorithm for DBNs

Authors

  • Zhiqiang Ma
  • Tuya Li
  • Shuangtao Yang
  • Li Zhang
Abstract

Deep networks have been widely used in many domains in recent years. However, pre-training deep networks with the greedy layer-wise algorithm is time consuming, and the scalability of that algorithm is severely restricted by its inherently sequential nature, in which only one hidden layer can be trained at a time. To speed up the training of deep networks, this paper focuses on the pre-training phase and proposes a pipelined pre-training algorithm that exploits a distributed cluster, significantly reducing pre-training time with no loss of recognition accuracy; it is more efficient than greedy layer-wise pre-training when a computational cluster is available. Finally, a comparative experiment between the greedy layer-wise algorithm and the pipelined pre-training algorithm is conducted on the TIMIT corpus. The results show that the pipelined algorithm makes efficient use of a distributed GPU cluster: we achieve speed-ups of 2.84 and 5.9 with no loss of recognition accuracy when using 4 slaves and 8 slaves, respectively, and the parallelization efficiency is close to 0.73.

Similar resources

Hybrid architectures for speech recognition

The state-of-the-art automatic speech recognition (ASR) systems utilize a statistical pattern recognition framework called HMM/GMM (Hidden Markov Model / Gaussian Mixture Model) with short-time spectral features such as Mel Frequency Cepstral Coefficients (MFCC) or Perceptual Linear Prediction (PLP). Although this approach has been shown to be effective in capturing speech patterns, recent perf...

Optimal fast digital error correction method of pipelined analog to digital converter with DLMS algorithm

In this paper, the convergence rate of the digital error correction algorithm, which corrects capacitor mismatch error and the finite, nonlinear gain of the op-amp, is increased significantly by the use of DLMS, an evolutionary search algorithm. To this end, a 16-bit pipelined analog to digital converter was modeled. The obtained digital model is a FIR filter with 16 adjustable weights. To adjust weights o...

Comparison and Combination of Multilayer Perceptrons and Deep Belief Networks in Hybrid Automatic Speech Recognition Systems

To improve speech recognition performance, many ways to augment or combine HMMs (Hidden Markov Models) with other models to build hybrid architectures have been proposed. The hybrid HMM/ANN (Hidden Markov Model / Artificial Neural Network) architecture is one of the most successful approaches. In this hybrid model, ANNs (which are often multilayer perceptron neural networks, MLPs) are used a...

Faster method for Deep Belief Network based Object classification using DWT

A Deep Belief Network (DBN) requires multiple large hidden layers with a high number of hidden units to learn good features from the raw pixels of large images. This implies more training time as well as computational complexity. By integrating the DBN with the Discrete Wavelet Transform (DWT), both training time and computational complexity can be reduced. The low-resolution images obtained after appli...

High-frequency Restoration using Deep Belief Nets for Super-resolution (The 16th Meeting on Image Recognition and Understanding)

Super-resolution techniques are generally divided into two approaches: example-based methods and statistical methods. Example-based methods [1] simply use (or select, in sparse coding [2]) pairs of low-resolution and high-resolution patches for the reconstruction. In this approach, a low-resolution input image is decomposed into patches, each of which is compared with the patches in the database an...


Publication date: 2017